The Theory of Neural Networks: the Hebb Rule and Beyond

Author

  • H. Sompolinsky
Abstract

Recent studies of the statistical mechanics of neural network models of associative memory are reviewed. The paper discusses models which have an energy function but depart from the simple Hebb rule. This includes networks with static synaptic noise, dilute networks, and synapses that are nonlinear functions of the Hebb rule (e.g., clipped networks). The properties of networks that employ the projection method are reviewed.

I: Introduction

A. The Hopfield Model

Models of neural networks which exhibit features of associative memory have been the subject of intense theoretical activity.1-8 Following Hopfield's work,1 attention has recently focused on networks that possess a global energy function. Assuming for simplicity a system of N two-state neurons, their energy function is given by
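The preview is cut off at this point. For reference only, the textbook form of the Hopfield energy function for two-state neurons \(S_i = \pm 1\) is

\[
E = -\frac{1}{2}\sum_{i \neq j} J_{ij}\, S_i S_j ,
\]

and the simple Hebb rule referred to above corresponds to couplings built from \(p\) stored binary patterns \(\{\xi^{\mu}\}\),

\[
J_{ij} = \frac{1}{N}\sum_{\mu=1}^{p} \xi_i^{\mu}\, \xi_j^{\mu} .
\]

These are the standard forms of these expressions, supplied for orientation; they are not recovered from the truncated preview.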


Similar resources

The Hebb Rule for Synaptic Plasticity: Algorithms and Implementations

In 1949 Donald Hebb published "The Organization of Behavior," in which he introduced several hypotheses about the neural substrate of learning and memory, including the Hebb learning rule or Hebb synapse. At that time very little was known about neural mechanisms of plasticity at the molecular and cellular levels. The primary data on which Hebb formulated his hypotheses was Golgi material, prov...
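For the rule named in the title, a minimal sketch of the textbook Hebb update is given below: a weight grows in proportion to the product of pre- and postsynaptic activity. The function name, shapes, and learning rate are illustrative choices, not taken from the paper.

```python
import numpy as np

def hebb_update(w, pre, post, lr=0.01):
    """One step of the plain Hebb rule: dw[i, j] = lr * post[i] * pre[j].

    w    : (n_post, n_pre) weight matrix
    pre  : (n_pre,) presynaptic activity
    post : (n_post,) postsynaptic activity
    """
    return w + lr * np.outer(post, pre)

# Repeated pairing of the same pre/post pattern strengthens the corresponding
# weights; the plain rule has no decay term, so growth is unbounded.
w = np.zeros((2, 3))
pre = np.array([1.0, 0.0, 1.0])
post = np.array([1.0, 1.0])
for _ in range(5):
    w = hebb_update(w, pre, post)
print(w)
```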


INTEGRATED ADAPTIVE FUZZY CLUSTERING (IAFC) NEURAL NETWORKS USING FUZZY LEARNING RULES

The proposed IAFC neural networks have both stability and plasticity because they use a control structure similar to that of the ART-1 (Adaptive Resonance Theory) neural network. The unsupervised IAFC neural network is the unsupervised neural network which uses the fuzzy leaky learning rule. This fuzzy leaky learning rule controls the updating amounts by fuzzy membership values. The supervised IAFC ...
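The preview only states that the updating amounts are controlled by fuzzy membership values; the IAFC equations themselves are not shown. Purely to illustrate that general idea (not the paper's actual rule), a leaky competitive update scaled by an FCM-style membership might look like the sketch below; every name and the membership formula are assumptions.

```python
import numpy as np

def fuzzy_leaky_update(centers, x, lr=0.1, m=2.0):
    """Illustrative leaky update: each cluster center moves toward the input
    by an amount scaled by its fuzzy membership (FCM-style, fuzzifier m)."""
    d = np.linalg.norm(centers - x, axis=1) + 1e-12            # distances to the input
    ratios = (d[:, None] / d[None, :]) ** (2.0 / (m - 1.0))    # pairwise distance ratios
    membership = 1.0 / ratios.sum(axis=1)                      # fuzzy memberships, sum to 1
    return centers + lr * membership[:, None] * (x - centers)  # membership-weighted leak toward x

centers = np.array([[0.0, 0.0], [1.0, 1.0]])
centers = fuzzy_leaky_update(centers, np.array([0.2, 0.1]))
print(centers)
```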


Active Learning in Recurrent Neural Networks Facilitated by a Hebb-like Learning Rule with Memory

We demonstrate in this article that a Hebb-like learning rule with memory paves the way for active learning in the context of recurrent neural networks. We compare active with passive learning and a Hebb-like learning rule with and without memory for the problem of timing to be learned by the neural network. Moreover, we study the influence of the topology of the recurrent neural network. Our r...
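The preview does not spell out how the memory enters the rule. As a generic illustration only (not the authors' formulation), a Hebb-like update can be given memory by accumulating pre/post coincidences into a decaying trace and driving the weight change from that trace; the decay constant and names below are assumptions.

```python
import numpy as np

def hebb_with_memory(w, trace, pre, post, lr=0.01, decay=0.9):
    """Hebb-like update driven by a decaying memory trace of past coincidences.

    trace keeps an exponentially weighted history of post[i] * pre[j] products,
    so the weight change reflects recent activity rather than only the current step."""
    trace = decay * trace + np.outer(post, pre)   # update the memory trace
    w = w + lr * trace                            # weight change driven by the trace
    return w, trace

w = np.zeros((2, 3))
trace = np.zeros_like(w)
for _ in range(10):
    w, trace = hebb_with_memory(w, trace, np.random.rand(3), np.random.rand(2))
```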


A novel stochastic Hebb-like learning rule for neural networks

We present a novel stochastic Hebb-like learning rule for neural networks. This learning rule is stochastic with respect to the selection of the time points when a synaptic modification is induced by pre- and postsynaptic activation. Moreover, the learning rule does not only affect the synapse between the pre- and postsynaptic neuron, which is called homosynaptic plasticity, but also further remote sy...
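The exact rule is not given in this preview; it describes stochastic selection of the time points at which a modification is induced, plus effects on synapses beyond the active one. A minimal sketch of just the first ingredient, with every name and probability below assumed for illustration, is:

```python
import numpy as np

rng = np.random.default_rng(0)

def stochastic_hebb_update(w, pre, post, lr=0.01, p_update=0.2):
    """Hebb-like update applied only at randomly selected time points:
    with probability p_update the coincidence term is added this step,
    otherwise the weights are left unchanged."""
    if rng.random() < p_update:
        w = w + lr * np.outer(post, pre)
    return w

w = np.zeros((2, 3))
for _ in range(50):
    w = stochastic_hebb_update(w, np.random.rand(3), np.random.rand(2))
```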


Modeling Hebb Learning Rule for Unsupervised Learning

This paper models the Hebb learning rule and proposes a neuron learning machine (NLM). The Hebb learning rule describes the plasticity of the connection between presynaptic and postsynaptic neurons and is itself unsupervised. It formulates the updating gradient of the connecting weight in artificial neural networks. In this paper, we construct an objective function via modeling the He...
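The snippet is cut off just as the objective function is introduced, so the NLM objective itself is not available here. As a generic illustration of the idea that a Hebbian weight change can be read as the gradient of an objective (not the paper's construction), gradient ascent on the toy objective J(w) = (1/2)(w·x)^2 gives the update Δw = η (w·x) x, i.e. postsynaptic output times presynaptic input:

```python
import numpy as np

def hebb_as_gradient_step(w, x, lr=0.01):
    """Gradient ascent on the toy objective J(w) = 0.5 * (w @ x) ** 2.

    grad J = (w @ x) * x, i.e. postsynaptic output times presynaptic input,
    which has the Hebbian form of a weight update."""
    y = w @ x              # postsynaptic output of a linear neuron
    return w + lr * y * x  # Hebbian step = gradient of the objective

w = 0.1 * np.random.randn(3)   # small nonzero start so the output is not identically zero
for _ in range(100):
    w = hebb_as_gradient_step(w, np.random.randn(3))
```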




Journal title:

Volume   Issue

Pages  -

Publication date: 1987